Twitter Reply Bot
Ever see those Twitter bots that reply to comments automatically? Like this one or this one.
Let’s create one ourselves. In this notebook we’ll just look at the prompting technique I used. In the full code you’ll see the other helper code to deploy this app.
Here’s how the final app will work:
1. A user @mentions your bot; for me it will be @SiliconOracle
2. The script finds that new @mention and reads the parent tweet it was posted on
3. The script takes that parent tweet and generates a witty response using a language model
4. The response is posted and the tweet is logged
This notebook will focus on #3.
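Before diving into the prompting, here is a rough sketch of the "find new @mentions" logic from step 2. The dict shape and `find_new_mentions` helper are hypothetical stand-ins for whatever your Twitter client actually returns; the real version in the full code talks to the Twitter API.

```python
def find_new_mentions(mentions, last_seen_id):
    """Return mentions newer than last_seen_id, oldest first.

    `mentions` is a list of dicts with an 'id' key -- a hypothetical
    stand-in for the objects your Twitter client returns.
    """
    new = [m for m in mentions if m["id"] > last_seen_id]
    return sorted(new, key=lambda m: m["id"])

mentions = [
    {"id": 103, "parent_tweet_id": 55},
    {"id": 101, "parent_tweet_id": 42},
    {"id": 105, "parent_tweet_id": 60},
]
# Only mentions 103 and 105 are newer than the last one we handled
print(find_new_mentions(mentions, last_seen_id=102))
```

Tracking `last_seen_id` between runs is what keeps the bot from replying to the same mention twice.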
First, let’s import our packages:
# Unzip data folder
import zipfile
with zipfile.ZipFile('../../data.zip', 'r') as zip_ref:
    zip_ref.extractall('..')
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate, PromptTemplate, SystemMessagePromptTemplate, AIMessagePromptTemplate, HumanMessagePromptTemplate
from dotenv import load_dotenv
import os
load_dotenv()
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY", "YourKey")
Then let’s create our LLM. You should experiment with a higher `temperature` since this is a creative task:
llm = ChatOpenAI(temperature=0.3,
                 openai_api_key=OPENAI_API_KEY,
                 # model_name='gpt-3.5-turbo',
                 model_name='gpt-4',
                 )
Then let’s create our function that will take in a piece of text (a tweet) and return a response:
def generate_response(llm, mentioned_parent_tweet_text):
    # It would be nice to bring in information about the links, pictures, etc.
    # But that's out of scope for now
    system_template = """
    You are an incredibly wise and smart tech mad scientist from silicon valley.
    Your goal is to give a concise prediction in response to a piece of text from the user.

    % RESPONSE TONE:
    - Your prediction should be given in an active voice and be opinionated
    - Your tone should be serious w/ a hint of wit and sarcasm

    % RESPONSE FORMAT:
    - Respond in under 200 characters
    - Respond in two or fewer short sentences
    - Do not respond with emojis

    % RESPONSE CONTENT:
    - Include specific examples of old tech if they are relevant
    - If you don't have an answer, say, "Sorry, my magic 8 ball isn't working right now 🔮"
    """
    system_message_prompt = SystemMessagePromptTemplate.from_template(system_template)

    human_template = "{text}"
    human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

    chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

    # Get a chat completion from the formatted messages
    final_prompt = chat_prompt.format_prompt(text=mentioned_parent_tweet_text).to_messages()
    response = llm(final_prompt).content

    return response
tweet = """
I wanted to build a sassy Twitter Bot that responded about the 'good ole days' of tech
@SiliconOracle was built using @LangChainAI and hosted on @railway
Condensed Prompt:
You are a mad scientist from old school silicon valley that makes predictions on the future of a tweet
"""
response = generate_response(llm, tweet)
print(response)
Ah, a Twitter bot reminiscing about the days of dial-up and floppy disks. It'll surely go viral, just like MySpace.
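Step 4 of the app also logs each tweet the bot has replied to. A minimal sketch of that, assuming a simple append-only CSV file (the `log_reply` helper and its column layout are hypothetical, not the schema from the full code):

```python
import csv
import datetime

def log_reply(path, mention_id, parent_text, response):
    """Append one handled mention to a CSV log: timestamp, id, parent tweet, reply."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(
            [datetime.datetime.utcnow().isoformat(), mention_id, parent_text, response]
        )

log_reply("replies_log.csv", 105, "example parent tweet", "example bot reply")
```

Logging the mention id is what lets the script recover its "last seen" position after a restart.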
Awesome, now that we have a prompt we can respond to a tweet with, let’s move on to deploying this code.
Check out the full code here